West Coast of the United States

The West Coast (or Pacific Coast) is the term for the westernmost coastal states of the United States. The term most ...